Not that far off from the GeForce 6800 Ultra. Its main differences are a 75MHz lower core clock, a 50MHz lower RAM clock, and a single molex connector for extra power. Nevertheless, the 6800 GT was considered high-end in its day (just about as good as the fastest nVidia card out there, just one slight step slower) and was very much capable of running games at a decent framerate. Its main competitor was the ATi X800 series. In the benchmark you can see the X800 XT PE (only the X850 XT PE was faster) running ahead of the (cheaper) 6800 GT and often falling behind the 6800 Ultra. Just like in the GeForce FX era, both companies kept releasing cards that tried to outrun the competition by a small margin.
A budget card with an interesting twist: it has all the ingredients of a budget card (single molex connector, DDR memory rather than GDDR3, lower clocks, fewer pipelines, D-Sub connector), yet it is fitted with essentially the same chip as the GeForce 6800 GT and Ultra (the NV40)! Of course this sounds better than it really is, as 6800 LE cards use partly defective NV40 chips that work fine once a few broken pipelines are disabled. Some cards had nearly perfect chips, however.
Back in the old days a buddy of mine bought an ASUS V9999TD/LE, a GeForce 6800 LE with 1.2 volts on the core (other brands used 1.1 volts, while GT/Ultra cards run at 1.3 or 1.4 volts). Thanks to the increased voltage the card overclocked a bit more easily. All the pipelines were unlockable (from 8 to 16) and the GPU clocked amazingly well. Eventually we installed his card in my PC (an Athlon XP 1800+ running at ~2.6GHz on an nForce2 platform), which was not bad but slower than our competitors with 3GHz Athlon 64s or Pentium Ms. In the end we reached 6033 points in 3DMark05 and 2689 in 3DMark06, good for #2 in 3DMark05 and #1 in 3DMark06 (at that time, at least). Not bad considering we did not use volt-mods, did not use LN2 and used a slower CPU.
To compare, here are the scores we got (see the short sketch after this list for the relative numbers):
Standard 6800LE with 3DMark05: about 2500.
Standard 6800GT with 3DMark05: about 4800.
Our overclocked 6800LE with 3DMark05: 6033.
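Put into relative terms, a minimal Python sketch (not part of the original write-up, using only the approximate scores listed above) shows the unlocked and overclocked LE ending up roughly 2.4 times as fast as a stock 6800 LE and about 26% ahead of a stock 6800 GT:

```python
# Back-of-the-envelope comparison using the approximate 3DMark05 scores quoted above.
scores = {
    "Standard 6800 LE": 2500,
    "Standard 6800 GT": 4800,
    "Overclocked 6800 LE (16 pipelines)": 6033,
}

stock_le = scores["Standard 6800 LE"]
stock_gt = scores["Standard 6800 GT"]

for card, score in scores.items():
    # Express each result relative to both stock cards.
    print(f"{card}: {score} marks "
          f"({score / stock_le:.2f}x stock LE, {score / stock_gt:.2f}x stock GT)")
```

Doubling the pipelines from 8 to 16 only accounts for part of that roughly 2.4x jump; the rest came from the higher clocks the card could reach.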
Unfortunately not every 6800 LE overclocks this well. Some can't unlock any pipelines at all or are stuck at 12. After newer games were released I also noticed quite a few reports from people who had fully unlocked their card without problems, but eventually saw a few artifacts in those new games.
Rather than just 'Gainward GeForce 6800 Ultra', Gainward named it 'PowerPack! Ultra/2600 "Golden Sample"', or GF6800U GS for short. This is a slightly overclocked (430MHz core / 575MHz (DDR1150) memory) GeForce 6800 Ultra graphics card.
Gainward also released a water-cooled version with 450MHz/600MHz (DDR1200) clocks for even more performance and a premium price (899 EUR).
A review sample from nVidia. It uses early A1-revision hardware and comes in a fancy nVidia aluminum box. This card is pretty much the direct ancestor of the final NV40 product. Just before nVidia sent these samples to reviewers, a bug popped up regarding the fan control, so a small modification was made on the back of the card.
To get a better view of what was going on with the NV40, I've put together a timeline. It starts quite early, and in the beginning it was unclear which name the NV40 would carry; I think NV35 (a GeForce FX product) and NV40 were sometimes mixed up in those days.
21 March 2001
This is way back, shortly after 3dfx had been acquired by nVidia. An article pops up suggesting that concepts/parts of 3dfx's next-generation chip, called Mojo, might be used in the NV30 or NV40.
9 July 2002
An estimate of the NV40's power: a fillrate of 4 gigapixels and 600 million polygons per second.
6 September 2002
The NV40 is expected to launch in the second half of 2004. It will support DirectX 10 and have 200 million transistors.
13 June 2003
The NV40 will launch in the first quarter of 2004 and will be a high-end card with AGP 8x and PCI Express.
14 August 2003
IBM will manufacture the NV40 chips on 130nm. The chip has 300 million transistors and is targeted at 600MHz.
13 September 2003
Neither nVidia's NV40 nor ATi's R420 will appear in 2003.
14 January 2004
nVidia has the NV40 chip ready and can deliver it to manufacturers at the end of February.
3 February 2004
GDDR3 memory chips won't be available in time for the NV40 and R420. The new cards might use GDDR2 instead.
6 February 2004
Although nVidia's first NV40 chips ran at 300MHz, the company is aiming for 600MHz. The second-revision chips, due at the end of February, should reach 450MHz to 550MHz.
23 February 2004
ATi's R420 chip is ready for mass production.
15 ~ 19 March 2004
PCB date of my card.
21 March 2004
Specifications and benchmarks of the new NV40 leak out. The A2 revision (my card has A1 on both the PCB and the GPU) runs at 475MHz with 256MB of GDDR3 at 600MHz (DDR1200). As far as I can track down (by placing the cards in a timeframe), this A2-revision card combines an A2-revision PCB with an A1-revision GPU.
1 ~ 2 April 2004
Chip date of my card.
9 April 2004
The NV40, better known as the GeForce 6800 Ultra, will have 16 pipelines just like the Radeon X800 XT. The X800 Pro will have 12 pipelines and the X800 SE gets 8. In the meantime The Inquirer discovers that ForceWare 60.72 reveals that GeForce 6800 Ultra will be the name of the new NV40 chip. It's also noted that the NV40 will draw 150W to 160W of power.
13 April 2004
Specifications of the GeForce 6800 Ultra are leaked just before the NDA expires. The chip has 222 million transistors and is manufactured on a 130nm process by IBM. Clock frequencies are 400MHz for the GPU and 550MHz (DDR1100) for the GDDR3 memory. It's noted that nVidia tried to fix the problems of the GeForce FX by giving more attention to the pixel shaders.
14 April 2004
The NDA has expired and reviews pop up. Reviewers use the same card as I have and conclude that it is a lot faster than the GeForce FX. It turns out that the cooling is quite good and the card draws 120W of power.
16 April 2004
The X800 Pro will be launched on 4 May 2004.
19 April 2004
nVidia announces the Quadro FX 4000 which is based on the NV40 core.
20 April 2004
Overclocking, noise and power consumption of the GeForce 6800 Ultra are tested. It turns out that the card draws only 17W more than the Radeon 9800 XT and GeForce FX 5950 Ultra.